# Information retrieval
## Crossencoder Mdebertav3 Base Mmarcofr
Author: antoinelouis · License: MIT · Downloads: 111 · Likes: 1
Tags: Text Embedding, French

A French cross-encoder model based on mDeBERTa-v3-base, designed for passage reranking, with strong performance on the mMARCO-fr dataset.
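The reranking workflow this kind of cross-encoder targets can be sketched with the sentence-transformers `CrossEncoder` class. The repository id below is inferred from the listing (author and model name) and is an assumption; check the exact id on the model page before running.

```python
# Minimal reranking sketch using the sentence-transformers CrossEncoder API.
# The checkpoint id is assumed from the listing above; verify it before use.
from sentence_transformers import CrossEncoder

model = CrossEncoder("antoinelouis/crossencoder-mdebertav3-base-mmarcofr")  # assumed id

query = "Quelle est la capitale de la France ?"
passages = [
    "Paris est la capitale et la plus grande ville de France.",
    "Le Mont Blanc est le plus haut sommet des Alpes.",
]

# The cross-encoder scores each (query, passage) pair jointly; higher = more relevant.
scores = model.predict([(query, p) for p in passages])
for passage, score in sorted(zip(passages, scores), key=lambda x: x[1], reverse=True):
    print(f"{score:.3f}  {passage}")
```

Because the query and passage are encoded together, cross-encoders of this kind are typically used to rerank a small candidate set returned by a cheaper first-stage retriever rather than to search a whole corpus.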
## Polish Reranker Base Ranknet
Author: sdadas · License: Apache-2.0 · Downloads: 332 · Likes: 1
Tags: Text Embedding, Transformers, Other

A Polish text-ranking model trained with the RankNet loss, suited to information retrieval tasks.
## Mmlw Retrieval Roberta Base
Author: sdadas · License: Apache-2.0 · Downloads: 408 · Likes: 1
Tags: Text Embedding, Transformers, Other

MMLW ("muszę mieć lepszą wiadomość", roughly "I must have a better message") is a Polish neural text encoder optimized for information retrieval; it maps queries and passages to 768-dimensional vectors.
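As a companion to the description above, here is a minimal dense-retrieval sketch: a query and passages are encoded into 768-dimensional vectors and passages are ranked by cosine similarity. The repository id and the `zapytanie: ` query prefix are assumptions based on how the MMLW retrieval models are documented; verify both against the model card.

```python
# Dense retrieval sketch: encode a Polish query and passages into 768-d vectors
# and rank passages by cosine similarity. Model id and query prefix are assumed.
from sentence_transformers import SentenceTransformer, util

encoder = SentenceTransformer("sdadas/mmlw-retrieval-roberta-base")  # assumed id

query = "zapytanie: Jak dbać o zdrowie?"  # "zapytanie: " prefix assumed per the model docs
passages = [
    "Regularna aktywność fizyczna i zbilansowana dieta wspierają zdrowie.",
    "Warszawa jest stolicą Polski.",
]

q_emb = encoder.encode(query, convert_to_tensor=True)     # shape: (768,)
p_emb = encoder.encode(passages, convert_to_tensor=True)  # shape: (2, 768)

scores = util.cos_sim(q_emb, p_emb)[0]
for passage, score in sorted(zip(passages, scores.tolist()), key=lambda x: x[1], reverse=True):
    print(f"{score:.3f}  {passage}")
```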
## Crossencoder Electra Base Mmarcofr
Author: antoinelouis · License: MIT · Downloads: 18 · Likes: 0
Tags: Text Embedding, French

A French cross-encoder model based on the ELECTRA architecture, designed for passage reranking in semantic search.
## TILDE
Author: ielab · Downloads: 134 · Likes: 3
Tags: Large Language Model, Transformers

TILDE is a BERT-based model used mainly for text retrieval and language modeling tasks.
## Msmarco MiniLM L12 En De V1
Author: cross-encoder · License: Apache-2.0 · Downloads: 19.62k · Likes: 5
Tags: Text Embedding, Transformers, Supports Multiple Languages

An English-German cross-lingual cross-encoder trained on the MS MARCO passage-ranking task, suitable for passage reranking in information retrieval.
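For completeness, the same reranking pattern can be sketched at the plain `transformers` level, where a cross-encoder scores each query-passage pair with a single-logit classification head. The repository id is inferred from the listing above and should be verified on the model page.

```python
# Sketch: score an English query against German passages with a sequence-
# classification head, as cross-encoder rerankers are commonly exposed.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "cross-encoder/msmarco-MiniLM-L12-en-de-v1"  # assumed id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

query = "How many people live in Berlin?"
passages = [
    "Berlin hat rund 3,7 Millionen Einwohner.",
    "Die Donau ist der zweitlängste Fluss Europas.",
]

features = tokenizer([query] * len(passages), passages,
                     padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    scores = model(**features).logits.squeeze(-1)  # one relevance score per pair

for passage, score in sorted(zip(passages, scores.tolist()), key=lambda x: x[1], reverse=True):
    print(f"{score:.3f}  {passage}")
```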